    Distributed affective space represents multiple emotion categories across the human brain

    The functional organization of human emotion systems, as well as their neuroanatomical basis and segregation in the brain, remains unresolved. Here, we used pattern classification and hierarchical clustering to characterize the organization of a wide array of emotion categories in the human brain. We induced 14 emotions (6 'basic', e.g. fear and anger; and 8 'non-basic', e.g. shame and gratitude) and a neutral state using guided mental imagery while participants' brain activity was measured with functional magnetic resonance imaging (fMRI). Twelve of the 14 emotions could be reliably classified from the haemodynamic signals. All emotions engaged a multitude of brain areas, primarily in midline cortices including the anterior and posterior cingulate gyri and the precuneus, in subcortical regions, and in motor regions including the cerebellum and premotor cortex. Similarity of subjective emotional experiences was associated with similarity of the corresponding neural activation patterns. We conclude that different basic and non-basic emotions have distinguishable neural bases characterized by specific, distributed activation patterns in widespread cortical and subcortical circuits. Regionally differentiated engagement of these circuits defines the unique neural activity pattern and the corresponding subjective feeling associated with each emotion.
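
    The analysis pipeline described above combines multi-class pattern classification with hierarchical clustering of emotion-specific activation patterns. A minimal sketch of that kind of pipeline is shown below, using scikit-learn and SciPy on simulated data; the array shapes, cross-validation scheme, and distance metric are illustrative assumptions rather than the authors' actual procedure.

```python
# Illustrative sketch, not the study's code: decode emotion categories from
# trial-wise voxel patterns, then hierarchically cluster the class means.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import StratifiedKFold, cross_val_score
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
n_trials, n_voxels, n_classes = 300, 500, 15      # 14 emotions + neutral (assumed sizes)
X = rng.standard_normal((n_trials, n_voxels))     # stand-in for fMRI activation patterns
y = np.repeat(np.arange(n_classes), n_trials // n_classes)   # emotion label per trial

# Cross-validated multi-class classification (chance level = 1/15)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
accuracy = cross_val_score(LinearSVC(max_iter=5000), X, y, cv=cv).mean()
print(f"mean decoding accuracy {accuracy:.3f} vs. chance {1 / n_classes:.3f}")

# Hierarchical clustering of class-mean activation patterns
mean_patterns = np.vstack([X[y == c].mean(axis=0) for c in range(n_classes)])
Z = linkage(pdist(mean_patterns, metric="correlation"), method="average")
print(dendrogram(Z, no_plot=True)["ivl"])         # leaf order of emotion categories
```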

    Involuntary Monitoring of Sound Signals in Noise Is Reflected in the Human Auditory Evoked N1m Response

    Constant sound sequencing, as operationalized by repeated stimulation with tones of the same frequency, has multiple effects. On the one hand, it activates mechanisms of habituation and refractoriness, which are reflected in a decrease of the amplitude of evoked responses. On the other hand, constant sequencing acts as spectral cueing, resulting in tones being detected faster and more accurately. In the present study, using magnetoencephalography, we investigated the impact of repeated tone stimulation on the N1m auditory evoked field while listeners were distracted from the test sounds. We stimulated subjects with four-tone trains in which the tones either all had the same frequency or had randomly assigned frequencies. The trains were presented either in a silent or in a noisy background. In silence, the pattern of source-strength decline under repeated stimulation suggested both refractoriness and habituation as underlying mechanisms. In noise, in contrast, there was no indication of source-strength decline. Furthermore, we found facilitating effects of constant sequencing on the detection of the single tones, as indexed by a shortening of N1m latency. We interpret our findings as a correlate of a bottom-up mechanism that constantly monitors the incoming auditory information, even when voluntary attention is directed to a different modality.
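
    To make the response measures concrete, the sketch below quantifies how the peak amplitude and latency of an N1m-like deflection could be tracked across the four positions of a tone train. The waveforms are simulated and the 80-150 ms search window is an assumption; a real analysis would use averaged source waveforms from the MEG data.

```python
# Illustrative sketch, not the study's code: peak amplitude and latency of an
# N1m-like response for each position in a four-tone train.
import numpy as np

fs = 1000                                         # sampling rate in Hz (assumed)
t = np.arange(-0.1, 0.4, 1 / fs)                  # peri-stimulus time in seconds
rng = np.random.default_rng(1)

def simulated_evoked(amplitude, latency_s):
    """Gaussian deflection standing in for an averaged evoked response."""
    wave = amplitude * np.exp(-((t - latency_s) ** 2) / (2 * 0.015 ** 2))
    return wave + 0.05 * rng.standard_normal(t.size)

# Pretend the response weakens and speeds up over tone positions 1-4
evoked_by_position = [simulated_evoked(a, lat) for a, lat in
                      zip([1.0, 0.7, 0.55, 0.5], [0.110, 0.105, 0.102, 0.100])]

window = (t >= 0.080) & (t <= 0.150)              # N1m search window (assumed)
for position, wave in enumerate(evoked_by_position, start=1):
    segment = wave[window]
    peak = int(np.argmax(segment))
    print(f"tone {position}: peak {segment[peak]:.2f} a.u., "
          f"latency {t[window][peak] * 1000:.0f} ms")
```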

    Stimulus-Related Independent Component and Voxel-Wise Analysis of Human Brain Activity during Free Viewing of a Feature Film

    Understanding how the brain processes stimuli in a rich natural environment is a fundamental goal of neuroscience. Here, we showed a feature film to 10 healthy volunteers during functional magnetic resonance imaging (fMRI) of hemodynamic brain activity. We then annotated auditory and visual features of the motion picture to inform the analysis of the hemodynamic data. The annotations were fitted both to voxel-wise data and to brain-network time courses extracted by independent component analysis (ICA). Auditory annotations correlated with two independent components (ICs) disclosing two functional networks, one responding to a variety of auditory stimulation and another responding preferentially to speech, with parts of the latter network also responding to non-verbal communication. Visual feature annotations correlated with four ICs delineating visual areas according to their sensitivity to different visual stimulus features. In comparison, a separate voxel-wise general linear model (GLM) analysis disclosed brain areas preferentially responding to sound energy, speech, music, visual contrast edges, body motion and hand motion, which largely overlapped with the results revealed by ICA. Differences between the results of the IC- and voxel-based analyses demonstrate that thorough analysis of voxel time courses is important for understanding the activity of specific sub-areas of the functional networks, while ICA is a valuable tool for revealing novel information about functional connectivity that need not be captured by a predefined model. Our results encourage the use of naturalistic stimuli and tasks in cognitive neuroimaging to study how the brain processes stimuli in rich natural environments.
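
    The core of the IC-based analysis, correlating stimulus-feature annotations with component time courses after hemodynamic convolution, can be sketched as follows. The data, repetition time, number of components, and the simple double-gamma response function are assumptions for illustration only.

```python
# Illustrative sketch, not the study's code: correlate an HRF-convolved
# stimulus annotation with independent-component time courses.
import numpy as np
from scipy.stats import gamma
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
tr, n_vols, n_voxels = 2.0, 300, 2000             # repetition time and data size (assumed)
data = rng.standard_normal((n_vols, n_voxels))    # stand-in for preprocessed fMRI data

def double_gamma_hrf(tr, duration=32.0):
    """A simple canonical-like hemodynamic response function."""
    t = np.arange(0.0, duration, tr)
    return gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)

annotation = (rng.random(n_vols) > 0.8).astype(float)      # e.g. "speech present" per volume
regressor = np.convolve(annotation, double_gamma_hrf(tr))[:n_vols]

ica = FastICA(n_components=20, random_state=0, max_iter=1000)
ic_timecourses = ica.fit_transform(data)          # shape (n_vols, n_components)

r = np.array([np.corrcoef(regressor, ic_timecourses[:, k])[0, 1]
              for k in range(ic_timecourses.shape[1])])
best = int(np.argmax(np.abs(r)))
print(f"IC {best} correlates most strongly with the annotation (r = {r[best]:.2f})")
```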

    GABAA-Mediated Inhibition Modulates Stimulus-Specific Adaptation in the Inferior Colliculus

    The ability to detect novel sounds in a complex acoustic context is crucial for survival. Neurons from midbrain through cortical levels adapt to repetitive stimuli while maintaining responsiveness to rare stimuli, a phenomenon called stimulus-specific adaptation (SSA). The site of origin and the mechanism of SSA are currently unknown. We used microiontophoretic application of gabazine to examine the role of GABAA-mediated inhibition in SSA in the inferior colliculus, the midbrain center for auditory processing. We found that gabazine slowed adaptation to high-probability stimuli but did not abolish it, with response magnitude and latency still depending on the probability of the stimulus. Blocking GABAA receptors increased the firing rate to both high- and low-probability stimuli but did not completely equalize the responses. Together, these findings suggest that GABAA-mediated inhibition acts as a gain-control mechanism that enhances SSA by modifying the responsiveness of the neuron.
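
    SSA is commonly quantified in this literature with an index that compares responses to the same tone when it is rare versus common, SI(f) = (d(f) - s(f)) / (d(f) + s(f)), where d and s are the responses to frequency f presented as the deviant and as the standard, respectively. The sketch below computes such an index from simulated spike counts; the counts and probabilities are assumptions, and this may not be the exact metric used in the study.

```python
# Illustrative sketch: a common stimulus-specific adaptation index,
# SI(f) = (d(f) - s(f)) / (d(f) + s(f)), computed from simulated spike counts.
import numpy as np

rng = np.random.default_rng(3)
deviant_counts = rng.poisson(lam=12, size=50)     # tone rare (e.g. 10% probability)
standard_counts = rng.poisson(lam=6, size=50)     # same tone common (e.g. 90% probability)

d, s = deviant_counts.mean(), standard_counts.mean()
si = (d - s) / (d + s)
print(f"deviant rate {d:.1f}, standard rate {s:.1f}, SSA index SI = {si:.2f}")
```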

    Selective Attention Increases Both Gain and Feature Selectivity of the Human Auditory Cortex

    Background. An experienced car mechanic can often deduce what’s wrong with a car by carefully listening to the sound of the ailing engine, despite the presence of multiple sources of noise. Indeed, the ability to select task-relevant sounds for awareness, whilst ignoring irrelevant ones, constitutes one of the most fundamental human faculties, but the underlying neural mechanisms have remained elusive. While most of the literature explains the neural basis of selective attention by means of an increase in neural gain, a number of papers propose enhancement of neural selectivity as an alternative or complementary mechanism. Methodology/Principal Findings. Here, to address the question of whether a pure gain increase alone can explain auditory selective attention in humans, we quantified auditory-cortex frequency selectivity in 20 healthy subjects by masking 1000-Hz tones with a continuous noise masker containing parametrically varied frequency notches around the tone frequency (i.e., a notched-noise masker). In different conditions, the subjects' task was to attend selectively to occasionally occurring slight increments in tone frequency (1020 Hz) or to tones of slightly longer duration, or to ignore the sounds. In line with previous studies, in the ignore condition the global field power (GFP) of event-related brain responses to the 1000-Hz tones at 100 ms from stimulus onset was suppressed as a function of the narrowing of the notch width. During the selective-attention conditions, the suppressive effect of the notch width on GFP was decreased, but as a function significantly different from the multiplicative one expected on the basis of a simple gain model of selective attention. Conclusions/Significance. Our results suggest that auditory selective attention in humans cannot be explained by a gain increase alone.
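
    Two quantitative steps in the description above lend themselves to a short sketch: computing the global field power as the standard deviation across channels of the averaged response, and asking whether the attend-condition GFP values are simply a multiplicatively scaled version of the ignore-condition values across notch widths. Everything numeric below (channel counts, notch widths, GFP values) is simulated for illustration.

```python
# Illustrative sketch: global field power (GFP) and a check of a purely
# multiplicative gain account across notch widths. All values are simulated.
import numpy as np

rng = np.random.default_rng(4)

def gfp(epochs):
    """GFP curve: std across channels of the trial-averaged response.
    epochs has shape (n_trials, n_channels, n_times)."""
    return epochs.mean(axis=0).std(axis=0)

epochs = rng.standard_normal((40, 64, 200))       # assumed trial/channel/time counts
print(f"example GFP curve has {gfp(epochs).size} time points")

# Peak GFP near 100 ms for five notch widths (arbitrary simulated values)
ignore_gfp = np.array([0.8, 1.2, 1.6, 1.9, 2.0])
attend_gfp = np.array([1.6, 1.9, 2.2, 2.4, 2.5])

# Best single multiplicative gain factor mapping ignore -> attend (least squares)
gain = ignore_gfp @ attend_gfp / (ignore_gfp @ ignore_gfp)
residual = attend_gfp - gain * ignore_gfp
print(f"fitted gain {gain:.2f}, residuals {np.round(residual, 2)}")
# Systematic, non-zero residuals would argue against a pure gain account.
```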

    Neural Processing of Short-Term Recurrence in Songbird Vocal Communication

    BACKGROUND: Many situations involving animal communication are dominated by recurring, stereotyped signals. How do receivers optimally distinguish between frequently recurring signals and novel ones? Cortical auditory systems are known to be pre-attentively sensitive to the short-term delivery statistics of artificial stimuli, but it is unknown whether this phenomenon extends to behaviorally relevant delivery patterns, such as those used during communication. METHODOLOGY/PRINCIPAL FINDINGS: We recorded and analyzed complete auditory scenes of spontaneously communicating zebra finch (Taeniopygia guttata) pairs over a week-long period, and show that they can produce tens of thousands of short-range contact calls per day. Individual calls recur at time scales (median interval 1.5 s) matching those at which mammalian sensory systems are sensitive to recent stimulus history. Next, we presented to anesthetized birds sequences of frequently recurring calls interspersed with rare ones, and recorded, in parallel, action potential and local field potential responses in the medio-caudal auditory forebrain at 32 unique sites. Variation in call recurrence rate over natural ranges led to widespread and significant modulation of the strength of neural responses. This modulation was highly call-specific in secondary auditory areas, but not in the main thalamo-recipient, primary auditory area. CONCLUSIONS/SIGNIFICANCE: Our results support the hypothesis that pre-attentive neural sensitivity to short-term stimulus recurrence is involved in the analysis of auditory scenes at the level of delivery patterns of meaningful sounds. This may enable birds to efficiently and automatically distinguish frequently recurring vocalizations from other events in their auditory scene.
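
    The recurrence statistic reported above (a median inter-call interval of 1.5 s across tens of thousands of calls) reduces to simple interval arithmetic once call onset times are available. The sketch below shows that computation on simulated timestamps; the exponential gap model and the 10-s history window are assumptions.

```python
# Illustrative sketch: recurrence statistics from a list of call onset times.
import numpy as np

rng = np.random.default_rng(5)
# Simulate one day of calling: exponential gaps with a 1.5 s mean (assumed model)
call_onsets = np.cumsum(rng.exponential(scale=1.5, size=20000))

intervals = np.diff(call_onsets)
print(f"{call_onsets.size} calls, median inter-call interval "
      f"{np.median(intervals):.2f} s")

# Fraction of calls recurring within an assumed 10 s "recent history" window
print(f"fraction of intervals under 10 s: {(intervals < 10).mean():.2f}")
```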

    Dissociation between the Activity of the Right Middle Frontal Gyrus and the Middle Temporal Gyrus in Processing Semantic Priming

    The aim of this event-related functional magnetic resonance imaging (fMRI) study was to test whether the right middle frontal gyrus (MFG) and middle temporal gyrus (MTG) show differential sensitivity to the effect of prime-target association strength on repetition priming. In the experimental condition (RP), the target occurred after repetitive presentation of the prime within an oddball design. In the control condition (CTR), the target followed a single presentation of the prime, with the same target probability as in RP. To manipulate semantic overlap between the prime and the target, both conditions (RP and CTR) employed either the onomatopoeia “oink” as the prime and the referent “pig” as the target (OP) or vice versa (PO), since semantic overlap was previously shown to be greater in OP. The results showed that the left MTG was sensitive to release from adaptation, while both the right MTG and the right MFG were sensitive to the extraction of sequence regularity and its verification. However, dissociated activity between OP and PO was revealed in RP only in the right MFG. Specifically, the target “pig” (OP) and the physically equivalent target in CTR elicited comparable deactivations, whereas the target “oink” (PO) elicited a less inhibited response in RP than in CTR. This interaction in the right MFG is explained by integrating these effects into a model of competition between perceptual and conceptual effects in priming.
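
    The dissociation described above is, statistically, a condition (RP vs. CTR) by prime-target direction (OP vs. PO) interaction on regional response estimates. A minimal sketch of such a test on per-subject ROI betas is given below; the subject count, effect sizes, and the use of a paired t-test on the difference of differences are assumptions, not the study's actual analysis.

```python
# Illustrative sketch: test a condition (RP vs CTR) x direction (OP vs PO)
# interaction on per-subject ROI estimates via the difference of differences.
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(6)
n_subjects = 16                                   # assumed
rp_op = rng.normal(-0.4, 0.3, n_subjects)         # deactivation to target "pig" in RP
ctr_op = rng.normal(-0.4, 0.3, n_subjects)        # matched control target
rp_po = rng.normal(-0.1, 0.3, n_subjects)         # less inhibited response to "oink" in RP
ctr_po = rng.normal(-0.4, 0.3, n_subjects)

interaction = (rp_op - ctr_op) - (rp_po - ctr_po)
t_val, p_val = ttest_1samp(interaction, 0.0)
print(f"interaction t({n_subjects - 1}) = {t_val:.2f}, p = {p_val:.3f}")
```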

    Hemodynamic responses in human multisensory and auditory association cortex to purely visual stimulation

    BACKGROUND: Recent findings of a tight coupling between visual and auditory association cortices during multisensory perception in monkeys and humans raise the question of whether consistent paired presentation of simple visual and auditory stimuli prompts conditioned responses in unimodal auditory regions or in multimodal association cortex once the visual stimuli are presented in isolation in a post-conditioning run. To address this issue, fifteen healthy participants took part in a "silent" sparse temporal event-related fMRI study. In the first (visual control) habituation phase they were presented with briefly flashing red visual stimuli. In the second (auditory control) habituation phase they heard brief telephone ringing. In the third (conditioning) phase we presented the visual stimulus (CS) and the auditory stimulus (UCS) simultaneously as a pair. In the fourth phase, participants either viewed flashes paired with the auditory stimulus (maintenance, CS-) or viewed the visual stimulus in isolation (extinction, CS+), according to a 5:10 partial reinforcement schedule. The participants had no task other than attending to the stimuli and indicating the end of each trial by pressing a button. RESULTS: During unpaired visual presentations (preceding and following the paired presentation) we observed significant brain responses beyond primary visual cortex, in the bilateral posterior auditory association cortex (planum temporale, planum parietale) and in the right superior temporal sulcus, whereas the primary auditory regions were not involved. By contrast, activity in auditory core regions was markedly larger when participants were presented with auditory stimuli. CONCLUSION: These results demonstrate the involvement of multisensory and auditory association areas in the perception of unimodal visual stimulation, which may reflect the instantaneous forming of multisensory associations and cannot be attributed to the sensation of an auditory event. More importantly, we are able to show that brain responses in multisensory cortices do not necessarily emerge from associative learning but can even occur spontaneously in response to simple visual stimulation.
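
    The fourth-phase design mixes paired (maintenance) and visual-only (extinction) trials under a partial reinforcement schedule. The sketch below generates such a randomized trial list; note that reading "5:10" as five paired trials for every ten visual-only trials is an assumption made only for this example, as is the block structure.

```python
# Illustrative sketch: a randomized trial list mixing paired (maintenance)
# and visual-only (extinction) trials. The 5-to-10 ratio interpretation and
# the block structure are assumptions for this example.
import random

def make_phase4_trials(n_blocks=4, n_paired=5, n_visual_only=10, seed=0):
    """Return a shuffled list of trial labels, block by block."""
    rng = random.Random(seed)
    trials = []
    for _ in range(n_blocks):
        block = ["paired"] * n_paired + ["visual_only"] * n_visual_only
        rng.shuffle(block)
        trials.extend(block)
    return trials

schedule = make_phase4_trials()
print(schedule[:15])
print("paired fraction:", round(schedule.count("paired") / len(schedule), 2))
```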

    The modulation of auditory novelty processing by working memory load in school age children and adults: a combined behavioral and event-related potential study

    BACKGROUND: We investigated the processing of task-irrelevant and unexpected novel sounds, and its modulation by working-memory load, in children aged 9-10 years and in adults. Environmental sounds (novels) were embedded amongst frequently presented standard sounds in an auditory-visual distraction paradigm. Each sound was followed by a visual target. In two conditions, participants evaluated the position of a visual stimulus (0-back, low load) or compared the position of the current stimulus with the one presented two trials before (2-back, high load). Processing of novel sounds was measured with reaction times, hit rates and the auditory event-related brain potentials (ERPs) Mismatch Negativity (MMN), P3a and Reorienting Negativity (RON), as well as the visual P3b. RESULTS: In both memory-load conditions, novels impaired task performance in adults whereas they improved performance in children. Auditory ERPs reflected age-related differences in the MMN time window, as children showed a positive ERP deflection to novels whereas adults lacked an MMN. The attention switch towards the task-irrelevant novel (reflected by the P3a) was comparable between the age groups. Adults showed more efficient reallocation of attention (reflected by the RON) under memory load than children. Finally, the P3b elicited by the visual target stimuli was reduced in both age groups when the preceding sound was a novel. CONCLUSION: Our results give new insights into the development of novelty processing, as they (1) reveal that task-irrelevant novel sounds can have opposite effects on performance in a visual primary task in children and adults, (2) show a positive ERP deflection to novels rather than an MMN in children, and (3) reveal effects of auditory novels on visual target processing.
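
    The ERP measures listed above (MMN, P3a, RON) are typically quantified from novel-minus-standard difference waves averaged over component-specific time windows. The sketch below shows that computation on simulated single-channel averages; the sampling rate, waveform shapes, and latency windows are assumptions rather than the study's parameters.

```python
# Illustrative sketch: novel-minus-standard difference wave and mean
# amplitudes in assumed component time windows.
import numpy as np

fs = 500                                          # sampling rate in Hz (assumed)
t = np.arange(-0.1, 0.8, 1 / fs)
rng = np.random.default_rng(7)

standard_erp = 0.2 * rng.standard_normal(t.size)                               # µV
novel_erp = standard_erp + 2.0 * np.exp(-((t - 0.30) ** 2) / (2 * 0.03 ** 2))  # P3a-like bump

difference = novel_erp - standard_erp

windows = {"MMN / early positivity": (0.10, 0.20),   # assumed latency ranges (s)
           "P3a": (0.25, 0.35),
           "RON": (0.45, 0.60)}
for name, (lo, hi) in windows.items():
    mask = (t >= lo) & (t <= hi)
    print(f"{name}: mean difference {difference[mask].mean():.2f} µV")
```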